fix: add required max_tokens argument to Anthropic client #436

Merged
DanielRyanSmith merged 1 commit into main from fix/anthropic-max-tokens
Apr 16, 2026

Conversation

@DanielRyanSmith
Collaborator

Overview

This PR resolves the missing argument error encountered when generating tests using the Anthropic Claude model.

Root Cause / Motivation

The latest version of the Anthropic SDK requires the `max_tokens` argument when calling `messages.create()` for Claude models. Previously, this argument was omitted, causing a `Missing required arguments` exception during the requirements-extraction phase of test generation.

Detailed Changelog

  • wptgen/llm.py: Added a DEFAULT_MAX_TOKENS = 8192 class constant to AnthropicClient. Included this constant as the max_tokens value in the kwargs dictionary passed to messages.create().
  • tests/test_anthropic.py: Updated the mock assertion in test_anthropic_generate_content to expect max_tokens=AnthropicClient.DEFAULT_MAX_TOKENS.

The Anthropic SDK requires the `max_tokens` argument when creating messages with Claude models. This commit adds a `DEFAULT_MAX_TOKENS` class constant to `AnthropicClient` and includes it in the kwargs for `generate_content`. Tests have been updated to reflect this change.
@DanielRyanSmith changed the title fix: add required max_tokens argument to Anthropic client Apr 10, 2026
@DanielRyanSmith requested a review from szy196 April 16, 2026 17:47
@DanielRyanSmith merged commit 6113a3d into main Apr 16, 2026
9 checks passed
@DanielRyanSmith deleted the fix/anthropic-max-tokens branch April 16, 2026 19:12
